You can’t outsource responsibility
Abdulahi Ahmed uses his mobile phone to watch a video on social media app TikTok in the Waberi district of Mogadishu, Somalia, 21 August 2023. REUTERS/Feisal Omar
Content moderators are vital for tech platforms, but critics claim these workers aren’t given the care they deserve. In Africa, the courts have taken notice.
In May, 150 workers met in Nairobi to establish the first Africa Content Moderators Union (ACMU). The move potentially has major ramifications for the world’s technology heavyweights, such as Meta and ByteDance, which own the social media platforms Facebook and TikTok respectively. The ACMU includes current and former employees of the third-party outsourcing businesses that provide content moderation services for companies including Meta.
The unionisation is the result of a backlash against the employment practices of the world’s tech leaders. Critics argue that the tech giants have used workers’ legal status as third-party employees to evade responsibility for providing liveable wages – the earnings of content moderators are among the lowest in the technology industry – and health benefits, despite the post-traumatic stress disorder (PTSD) caused by the graphic and distressing material these workers are required to review.
A flashpoint for these concerns was the firing of Daniel Motaung – a South African Facebook moderator who was let go in 2019 from Sama, an outsourcing company used at the time by Meta. He subsequently sued Meta and Sama over the PTSD he says was caused by exposure to distressing material without adequate support, as well as over an allegedly unfair dismissal connected to his efforts to unionise for better working conditions for himself and his colleagues. Sama declined to comment on the case as it’s ongoing.
Courts are starting to recognise that these social media platforms are the true employers of these individuals and should be held accountable under the rule of law
Rosa Curling
Director, Foxglove
Meta objected to its involvement in the litigation, arguing that it wasn’t Motaung’s employer and that it couldn’t be subjected to the Nairobi-based proceedings because it was neither registered nor operating in Kenya. Nonetheless, this February, in a decision campaigners saw as a milestone for accountability, Kenya’s Employment and Labour Relations Court found that the tech company was a ‘proper party’ and ruled that the case could proceed. Meta has said it will appeal.
This case – which has paved the way for other litigation as well as the formation of the ACMU – was brought by Kenyan law firm Nzili & Sumbi Advocates in partnership with Foxglove, a UK-headquartered group that fights for ‘tech justice’. Foxglove was founded by lawyer, investigator and activist Cori Crider, her former Reprieve colleague Martha Dark and public lawyer Rosa Curling, formerly of law firm Leigh Day.
Since 2019, Foxglove has shone a spotlight on important issues such as the UK Home Office’s use of algorithms to determine visa decisions, which in 2020 prompted the government to suspend its system, pending a review of alleged ‘discrimination’. Foxglove has been a thorn in the side of the Big Tech sector, challenging its practices and alleged lack of transparency.
The team’s involvement in Motaung’s case stemmed from Crider’s conversations with moderators about the damaging impact of their jobs on their mental health, and the lack of support and legal safeguards they received from both the outsourcing firms and the companies using them. ‘We felt like the power of these big companies was going unchecked’, says Curling, a Director at Foxglove. ‘They were doing a lot of “internal reflection”’, she adds wryly, but ‘bluntly, no one was suing them’.
She believes that taking the sector’s heavyweights to court is currently the best way to force them to follow the rule of law: ‘because they have so many resources, they genuinely feel like they’re above it’. In particular, while other campaigners have gone down the privacy route, which Curling also supports, Foxglove felt that ‘what was missing was challenging [tech companies’] power from a labour rights angle’.
What became clear to Foxglove was what Curling describes as the ‘appallingly exploitative and abusive working conditions’, which Meta ‘tries to distance itself from by partnering with outsourcing companies and pretending it’s nothing to do with them’. But the reality, she says, is that content moderators are at the heart of the company’s operations. Their work is ‘core to their business model’, she says. ‘And their entire working conditions are set by software designed by Meta.’ Meta didn’t respond to Global Insight’s request for comment but has previously stated that it requires partners ‘to provide industry-leading pay, benefits, and support’.
But it’s not just Meta, Curling says. ‘It’s TikTok, it’s Google. All the Big Tech social platforms follow the same model’, she says. TikTok, for instance, turns to a company called Majorel in Kenya. Curling counts a lack of time away from screens and an absence of ‘proper psychiatric and psychological support’ among the failings of many of the big-name tech companies operating in this space. ByteDance – TikTok’s owner – declined to comment when contacted by Global Insight, while Google and Majorel didn’t respond to requests for comment.
Sama has challenged its portrayal by its critics. In a statement on its website, updated earlier this year, the company explains that it ended its content moderation work in March. It says it had ‘paid employees wages that are consistently three times the minimum wage and two times the living wage in Kenya’, with an average 2.2x increase during 2022, and that employees received healthcare and other benefits, as well as mandatory wellness breaks and one-to-one counselling. Going forward, it says, it will conduct audits – including by third parties – covering pay, wellness and working conditions.
Part of the challenge of kickstarting change within the employment culture of Big Tech companies, Curling says, is the practice of requiring employees and third-party workers to sign non-disclosure agreements. These ‘make them feel like they are going to be sacked if they’re critical [of their employer]’. This is particularly egregious, she says, given the lack of support that the people doing this ‘really skilled job’ tend to receive. She also argues that moderation is critically under-resourced in regions where it has real-world impact.
Curling points to incriminating details revealed in the ‘Facebook Papers’ – a trove of internal documents provided to the US Securities and Exchange Commission in 2021 by Frances Haugen, a former Facebook product manager turned whistleblower. What really stood out, she says, was a 2020 company summary showing that the vast majority of the resources Facebook had dedicated to removing misinformation – 83 per cent – were focused on the US, with just 17 per cent allocated to the ‘Rest of World’.
Foxglove is supporting a second case, in Kenya’s High Court, which is also being argued by Nzili & Sumbi Advocates. Brought on behalf of two Ethiopian researchers and a Kenyan civil rights group, it alleges that Facebook helped fuel ethnic violence in Ethiopia’s civil war by poorly moderating hateful, dangerous content and then failing to remove it quickly enough. ‘For a country of 117 million people, they’ve got 25 moderators who speak three of the 85 languages’, says Curling.
Exacerbating this, she adds, is that the algorithm Meta also uses to moderate Facebook content ‘hasn’t been trained in any of these languages’. As a result, ‘the platform becomes awash with incitements to violence’. Nor, the lawsuit argues, is this a geography-blind problem: it alleges that Facebook treated users in African countries differently to those in the West, fostering a ‘culture of disregard for human rights’ that, it claims, ultimately led to the murder of a plaintiff’s father.
Neither Facebook nor Meta responded to Global Insight’s direct requests for comment on the above, but Meta has stated that hate speech and incitement to violence are against the platform’s rules and that it has invested heavily in moderation and technology to counter hate. It added that ‘we employ staff with local knowledge and expertise and continue to develop our capabilities to catch violating content in the most widely spoken languages in Ethiopia’.
The battle to curb the discriminatory practices that Foxglove sees as endemic to Big Tech is unlikely to end soon. Earlier this year, in a third case in Kenya, 184 content moderators sued Sama for unfair dismissal after they were let go when its contract with Meta ended and it ceased providing content moderation services. In June, a Kenyan court suspended the sacking of the moderators, ordered Meta to provide ‘medical, psychiatric and psychological care’ for the claimants and barred its new outsourcing provider, Majorel, from blacklisting them from applying for the same roles. Meta said it would appeal the ruling.
In August, the court subsequently directed all the parties – Majorel, Meta, the moderators and Sama – to enter mediation to resolve the dispute. Sama told Global Insight that it can’t comment on the case as it’s ongoing but that it’s ‘pleased to be in mediation and believes it is in the best interest of all parties to come to an amicable resolution’. No settlement has yet been announced. Majorel and Meta didn’t respond to requests for comment on this case.
Ultimately, for Curling, the crux of the matter is that ‘all these companies have the money to figure out how to keep the impact on these people of their work to an absolute minimum, and to enable them to do their role well’. She expects to fight the Motaung case ‘all the way to the Kenya Supreme Court’ – but is happy with their success so far. ‘I hope there’s a bit of a tide of change’, she adds. ‘Courts are starting to recognise that these social media platforms are the true employers of these individuals and should be held accountable under the rule of law.’
And unions such as the recently formed ACMU have an ‘incredibly important role to play’ in rewriting the power relationship between the Big Tech companies and those responsible for moderating their content, she says. ‘The people who are doing this work are crucial to companies like Meta’, she adds. ‘Imagine if they all went on strike for a week.’
Tom Wicker is a freelance journalist and can be contacted at tomw@tomwicker.org